Definition of Western United States
(noun) the region of the United States lying to the west of the Mississippi River